Automatic human affect recognition research aims to enable computers to understand human affective behavior through sensors, with the goal of building human-centered, affect-sensitive computing environments. This talk includes four parts.
The state of the art
Automated analysis of human affective behavior has attracted increasing attention from researchers in multiple disciplines. I will present a picture of this research through our recent comprehensive survey (accepted in PAMI) where we examine available approaches and databases, focusing on two trends in the field: analysis of spontaneous affective behavior and multimodal analysis of human affective behavior.
Multimodal fusion for audiovisual affect recognition
Many studies from both psychological and engineering research fields have demonstrated the importance of integrating information from multiple modalities to yield a coherent representation and inference of emotions. I will present our efforts toward multimodal affect recognition, focusing on the Multi-stream Fused Hidden Markov Model, which models multimodal fusion according to the maximum entropy principle and the maximum mutual information criterion.
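To make the fusion idea concrete, here is a minimal sketch of combining per-stream HMM log-likelihoods with stream weights. This is a simplified late-fusion illustration, not the actual Multi-stream Fused HMM coupling or its maximum-mutual-information training; all model parameters and the `w_audio` weight are hypothetical.

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm (log-sum-exp for stability)."""
    n = len(pi)
    log_alpha = [math.log(pi[i]) + math.log(B[i][obs[0]]) for i in range(n)]
    for t in range(1, len(obs)):
        new = []
        for j in range(n):
            terms = [log_alpha[i] + math.log(A[i][j]) for i in range(n)]
            m = max(terms)
            s = m + math.log(sum(math.exp(x - m) for x in terms))
            new.append(s + math.log(B[j][obs[t]]))
        log_alpha = new
    m = max(log_alpha)
    return m + math.log(sum(math.exp(x - m) for x in log_alpha))

def fused_score(audio_obs, video_obs, audio_hmm, video_hmm, w_audio=0.5):
    """Late fusion: weighted sum of per-stream HMM log-likelihoods.
    A classifier would compute this per emotion class and pick the max."""
    la = forward_log_likelihood(audio_obs, *audio_hmm)
    lv = forward_log_likelihood(video_obs, *video_hmm)
    return w_audio * la + (1.0 - w_audio) * lv

# Hypothetical 2-state, 2-symbol HMMs for one emotion class
audio_hmm = ([0.6, 0.4],
             [[0.7, 0.3], [0.4, 0.6]],
             [[0.9, 0.1], [0.2, 0.8]])
video_hmm = ([0.5, 0.5],
             [[0.8, 0.2], [0.3, 0.7]],
             [[0.6, 0.4], [0.1, 0.9]])

score = fused_score([0, 0, 1], [1, 1, 0], audio_hmm, video_hmm, w_audio=0.6)
```

In practice one such fused score would be computed for each emotion class, and recognition picks the class with the highest score.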
Multi-view facial expression recognition
The ability to handle multi-view facial expressions is important for computers to understand affective behavior in less constrained environments. I will present our investigation of various feature extraction methods, classifiers, and classifier fusion for multi-view facial expression recognition. We also explored an interesting question: whether non-frontal-view facial expression analysis can achieve performance equal to or better than that of existing frontal-view facial expression studies.
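As a toy illustration of classifier fusion across feature extractors, the sketch below combines per-class scores from two hypothetical feature sets with the sum rule. The nearest-centroid classifiers, the "geometric" and "appearance" feature names, and all numbers are illustrative assumptions, not the features or classifiers from the work itself.

```python
import math

def nearest_centroid_scores(x, centroids):
    """Per-class score for one feature vector: negative Euclidean
    distance to each class centroid (higher is better)."""
    return {c: -math.dist(x, mu) for c, mu in centroids.items()}

def fuse_and_classify(feature_vectors, centroid_sets):
    """Sum-rule classifier fusion: add per-class scores across the
    different feature extractors, then pick the best class."""
    totals = {}
    for x, centroids in zip(feature_vectors, centroid_sets):
        for c, s in nearest_centroid_scores(x, centroids).items():
            totals[c] = totals.get(c, 0.0) + s
    return max(totals, key=totals.get)

# Hypothetical centroids for two expression classes under two feature sets
geom_centroids = {"smile": [1.0, 0.0], "frown": [0.0, 1.0]}
app_centroids  = {"smile": [0.8, 0.2], "frown": [0.1, 0.9]}

label = fuse_and_classify([[0.9, 0.1], [0.7, 0.3]],
                          [geom_centroids, app_centroids])
# label == "smile"
```

The same sum-rule pattern extends to fusing classifiers trained on different head poses, which is one simple way to approach the multi-view setting.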
Spontaneous affective expression analysis
I will present our initial efforts toward recognition of audiovisual affective expressions occurring in a psychological human interaction setting—the Adult Attachment Interview (AAI). This work only scratches the surface of the complexity of automatic spontaneous affect recognition, which leads to the discussion of challenges at the end of my talk.